Incorporating Type II Error Probabilities from Independence Tests into Score-Based Learning of Bayesian Network Structure

Authors

  • Eliot Brenner
  • David Sontag
Abstract

We give a new consistent scoring function for structure learning of Bayesian networks. In contrast to traditional approaches to score-based structure learning, such as BDeu or MDL, the complexity penalty that we propose is data-dependent and is given by the probability that a conditional independence test correctly shows that an edge cannot exist. What distinguishes this new scoring function from earlier work is that it becomes computationally easier to maximize as the amount of data increases. We prove a polynomial sample complexity result, showing that maximizing this score is guaranteed to correctly learn a structure with no false edges and a distribution close to the generating distribution, whenever there exists a Bayesian network that is a perfect map for the data-generating distribution. Although the new score can be used with any search algorithm, in [BS13] we gave empirical results showing that it is particularly effective when used together with a linear programming relaxation approach to Bayesian network structure learning. The present paper contains the full proofs of the finite-sample complexity results in [BS13], as well as a detailed explanation of the computation of certain error probabilities, called β-values, whose precomputation and tabulation are necessary for the implementation of the algorithm in [BS13].

May 13, 2015. arXiv:1505.02870v1 [cs.LG] 12 May 2015
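
As a rough illustration of the idea only (not the paper's actual score), the sketch below shows a hypothetical decomposable structure score in which the usual fixed complexity penalty is replaced by per-edge terms derived from precomputed β-values, i.e., type II error probabilities of conditional independence tests. The names beta_value, log_likelihood, and beta_penalized_score, and the functional form of the penalty, are illustrative assumptions; see [BS13] for the score the authors actually define.

    # Hypothetical sketch of a beta-value-penalized structure score.
    # Not the score defined in the paper; illustrative only.
    import math

    def beta_value(x, y, cond_set, data):
        """Placeholder: type II error probability (beta-value) of a conditional
        independence test of x against y given cond_set. In [BS13] such values
        are precomputed and tabulated."""
        raise NotImplementedError

    def log_likelihood(graph, data):
        """Placeholder: maximum-likelihood score of the DAG on the data."""
        raise NotImplementedError

    def beta_penalized_score(graph, data):
        # graph: dict mapping each node to the set of its parents
        score = log_likelihood(graph, data)
        for child, parents in graph.items():
            for parent in parents:
                # Data-dependent penalty for keeping the edge parent -> child:
                # the smaller the beta-value (the more confidently a test could
                # have ruled the edge out), the larger the penalty.
                b = beta_value(parent, child, parents - {parent}, data)
                score += math.log(max(b, 1e-300))  # illustrative form only
        return score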

Similar Resources

Bayesian and Decision Models in AI 2010-2011 Assignment II – Learning Bayesian Networks

The purpose of this assignment is to test and possibly expand your knowledge about learning Bayesian networks from data. Recall that learning Bayesian networks involves both structure learning, i.e., learning the graph topology from data, and parameter learning, i.e., learning the actual, local probability distributions from data. There are basically two approaches to structure learning: (i) se...

An Introduction to Inference and Learning in Bayesian Networks

Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...
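
As a minimal, illustrative data structure (not tied to any of the papers listed here), a Bayesian network can be stored exactly as described above: a DAG given by per-node parent sets, plus one conditional probability table (CPT) per node. The class below is a bare-bones Python sketch of that representation; all names are hypothetical.

    # Bare-bones representation of a Bayesian network: a DAG (parent sets)
    # plus one conditional probability table (CPT) per node.
    class BayesianNetwork:
        def __init__(self):
            self.parents = {}  # node -> tuple of parent nodes
            self.cpt = {}      # node -> {(parent_values, node_value): probability}

        def add_node(self, node, parents=()):
            self.parents[node] = tuple(parents)
            self.cpt[node] = {}

        def set_probability(self, node, parent_values, node_value, p):
            self.cpt[node][(tuple(parent_values), node_value)] = p

        def joint_probability(self, assignment):
            # Factorization over the DAG: P(x) = prod_i P(x_i | parents(x_i)).
            prob = 1.0
            for node, parents in self.parents.items():
                parent_values = tuple(assignment[parent] for parent in parents)
                prob *= self.cpt[node][(parent_values, assignment[node])]
            return prob

For example, a two-node network Rain -> WetGrass would store parent sets {'Rain': (), 'WetGrass': ('Rain',)} and one CPT entry per combination of parent and node values.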

Bayesian Networks 2013–2014 Assignment II – Learning Bayesian Networks

The purpose of this assignment is to test and possibly expand your knowledge about learning Bayesian networks from data, by exploring various issues such as comparison of learning algorithms, dealing with missing data and evaluation of the networks learned. Recall that learning Bayesian networks involves both structure learning, i.e., learning the graph topology from data, and parameter learnin...

Bayesian and Decision Models in AI 2011-2012 Assignment II – Learning Bayesian Networks

The purpose of this assignment is to test and possibly expand your knowledge about learning Bayesian networks from data, by exploring various issues such as comparison of learning algorithms, dealing with missing data and evaluation of the networks learned. Recall that learning Bayesian networks involves both structure learning, i.e., learning the graph topology from data, and parameter learnin...

Bayesian and Decision Models in AI 2009-2010 Assignment II – Learning Bayesian Networks

Search-and-score algorithms search for a Bayesian network structure that fits the data best (in some sense). They start with an initial network structure (often a graph without arcs or a complete graph) and then traverse the search space of network structures by, at each step, either removing an arc, adding an arc, or reversing an arc. Read again the paper by Castelo and Kočka [1] for a good ov...
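
The greedy procedure described above can be sketched as follows. This is a generic hill-climbing search over arc additions, removals, and reversals with a placeholder score function; it is an illustrative sketch, not the specific method of Castelo and Kočka [1], and the names hill_climb, neighbors, and creates_cycle are assumptions.

    # Generic search-and-score hill climbing over DAG structures: repeatedly
    # apply the single arc addition, removal, or reversal that most improves
    # the (placeholder) score. Structures are dicts of node -> set of parents.
    from itertools import permutations

    def creates_cycle(parents, child, new_parent):
        # Adding new_parent -> child creates a cycle iff child is already an
        # ancestor of new_parent.
        stack, seen = [new_parent], set()
        while stack:
            node = stack.pop()
            if node == child:
                return True
            if node not in seen:
                seen.add(node)
                stack.extend(parents[node])
        return False

    def neighbors(parents):
        # All structures reachable by one arc addition, removal, or reversal.
        for u, v in permutations(list(parents), 2):
            if u in parents[v]:
                removed = {n: set(ps) for n, ps in parents.items()}
                removed[v].discard(u)          # remove arc u -> v
                yield removed
                if not creates_cycle(removed, u, v):
                    rev = {n: set(ps) for n, ps in removed.items()}
                    rev[u].add(v)              # reverse arc u -> v to v -> u
                    yield rev
            elif not creates_cycle(parents, v, u):
                added = {n: set(ps) for n, ps in parents.items()}
                added[v].add(u)                # add arc u -> v
                yield added

    def hill_climb(nodes, data, score):
        current = {n: set() for n in nodes}    # start from the arc-less graph
        current_score = score(current, data)
        while True:
            best, best_score = None, current_score
            for candidate in neighbors(current):
                s = score(candidate, data)
                if s > best_score:
                    best, best_score = candidate, s
            if best is None:                   # local optimum reached
                return current
            current, current_score = best, best_score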

Journal:
  • CoRR

Volume: abs/1505.02870    Issue: –

Pages: –

Publication date: 2015